Authors: Jozef Barunik and Michael Ellington
Abstract: We propose new measures to characterize dynamic network connections in large financial and economic systems. In doing so, our measures allow one to describe and understand causal network structures that evolve through time and over horizons, using variance decomposition matrices from time-varying parameter VAR (TVP VAR) models. These methods allow researchers and practitioners to examine network connections over any horizon of interest whilst also being applicable to a wide range of economic and financial data. Our empirical application redefines the meaning of big in big data in the context of TVP VAR models and tracks dynamic connections among illiquidity ratios of all S&P 500 constituents. We then study the information content of these measures for the market return and the real economy.
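The horizon-specific measures build on forecast error variance decompositions. As background, a minimal numpy sketch of how Diebold-Yilmaz-style connectedness measures are read off such a decomposition matrix is shown below; the matrix `theta` is simulated and the TVP VAR estimation itself is not reproduced, so this illustrates the generic building block rather than the authors' estimator.

```python
# Minimal sketch: Diebold-Yilmaz-style connectedness measures from a forecast
# error variance decomposition (FEVD) matrix. `theta` is an illustrative N x N
# matrix in which element (i, j) is the share of variable i's forecast error
# variance attributed to shocks in variable j.
import numpy as np

def connectedness(theta):
    """Total and directional connectedness from an FEVD matrix."""
    theta = theta / theta.sum(axis=1, keepdims=True)   # row-normalise shares
    N = theta.shape[0]
    off = theta - np.diag(np.diag(theta))              # cross-variable shares only
    total = 100.0 * off.sum() / N                      # system-wide connectedness
    to_others = 100.0 * off.sum(axis=0)                # transmitted by each variable
    from_others = 100.0 * off.sum(axis=1)              # received by each variable
    return total, to_others, from_others

rng = np.random.default_rng(0)
theta = rng.dirichlet(np.ones(4), size=4)              # simulated 4x4 decomposition
print(connectedness(theta))
```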
Jozef Baruník Associate Professor, Czech Academy of Sciences and Charles University
Jozef Baruník is an Associate Professor at the Institute of Economic Studies, Charles University in Prague. He also serves as head of the Econometrics Department at the Czech Academy of Sciences. In his research, he develops mathematical models for understanding financial problems (such as measuring and managing financial risk), develops statistical methods, and analyzes financial data. He is especially interested in asset pricing, high-frequency data, financial econometrics, machine learning, high-dimensional financial data sets (big data), and frequency-domain econometrics (cyclical properties and behavior of economic variables).
Authors: Zuzana Praskova
Abstract: TBA
Zuzana Prášková Professor, Charles University
Authors: Souhir Ben Amor and Michael Althoff
Abstract: Emerging Markets (EM) have become an attractive investment target due to fast economic growth and increasing liquidity and transparency. However, these markets are characterised by a high level of volatility due to external price shocks and domestic policy instability. Therefore, an efficient risk measure and hedging strategy are needed to help market players hedge their investments against this risk. In this paper, a daily systemic risk measure, the Financial Risk Meter (FRM), is proposed. The FRM is based on Lasso quantile regression designed to capture tail event co-movements. In practice, the FRM is applied to the return time series of selected financial institutions and to macroeconomic risk factors. Using the FRM, we identify companies exhibiting extreme "co-stress" as well as "activators" of stress. FRM Emerging Markets is applied to capture the systemic risk behaviour embedded in the returns of the 25 largest Emerging Markets financial institutions, covering the BRIMST countries (Brazil, Russia, India, Mexico, South Africa, Turkey), and thereby reflects the financial linkages between these economies. Concerning the macroeconomic risk factors, in addition to those of Adrian and Brunnermeier (2016), we use the Emerging Market yield spread over respective US Treasuries and the above-mentioned countries' currencies. We then study the relationship between these factors and tail event network behaviour to derive policy recommendations that help investors choose a suitable market for investment and optimise their portfolios. The results indicate that Emerging Market financial institutions were heavily impacted by the crisis from mid-March 2020 onwards.
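As background, a minimal sketch of the FRM idea on simulated returns: each institution's returns are regressed on the other institutions' returns with an L1-penalised quantile (tail) regression, and a penalty level is aggregated across institutions. The penalty-selection rule below is purely illustrative and is not the published FRM procedure.

```python
# Illustrative FRM-style calculation on simulated data; the sparsity-based
# penalty selection here is a stand-in, not the rule used in the FRM papers.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(1)
R = rng.standard_normal((250, 10))                 # simulated daily returns, 10 firms

def frm_component(R, j, tau=0.05, alphas=(0.001, 0.01, 0.1)):
    """Illustrative penalty level for the tail regression of institution j."""
    y, X = R[:, j], np.delete(R, j, axis=1)
    for a in alphas:                               # coarse grid, smallest first
        fit = QuantileRegressor(quantile=tau, alpha=a, solver="highs").fit(X, y)
        if np.sum(np.abs(fit.coef_) > 1e-8) <= X.shape[1] // 2:
            return a                               # first penalty giving a sparse fit
    return alphas[-1]

frm = np.mean([frm_component(R, j) for j in range(R.shape[1])])
print("illustrative FRM level:", frm)
```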
Souhir Ben Amor Guest Researcher, IHEC Sousse, Tunisia
Authors: Michal Pesta
Abstract: How should one deal with multivariate dependent non-negative time series, in particular when some or all of the series contain a non-negligible portion of zeros while the remaining observations are positive? Classical modeling techniques surprisingly fail. The proposed method is illustrated on the estimation of the dependency structure among several actuarial lines of business, which is currently a front-burner issue in the insurance industry.
Authors: Sebastiano Vitali
Abstract: A pension fund manager typically decides the allocation of the pension fund assets with a view to long-term sustainability. Many Asset and Liability Management models, in the form of multistage stochastic programming problems, have been proposed to help the pension fund manager define the optimal allocation given a multi-objective function. The recent literature proposes multistage stochastic dominance constraints to guarantee that the optimal strategy stochastically dominates a benchmark portfolio. In this work, we extend previous results to another type of stochastic dominance that appears more consistent in a multistage framework. Indeed, instead of considering multiple single-stage stochastic dominance constraints, we apply a single constraint that jointly involves multiple stages. Numerical results show the dissimilarities between the different ways to interpret and apply multistage stochastic dominance.
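As background, the standard single-stage building block is a second-order stochastic dominance constraint of the portfolio outcome over the benchmark; a sketch in generic notation (the joint multistage formulation studied in the talk is not reproduced here):

```latex
% Single-stage second-order stochastic dominance of the portfolio value X_t
% over the benchmark Y_t at stage t (generic notation, illustration only)
X_t \succeq_{\mathrm{SSD}} Y_t
\quad\Longleftrightarrow\quad
\mathbb{E}\big[(\eta - X_t)_+\big] \;\le\; \mathbb{E}\big[(\eta - Y_t)_+\big]
\quad \text{for all } \eta \in \mathbb{R}.
```

Imposing this constraint separately at every stage gives the stage-wise approach; the formulation considered in the talk replaces these with a single constraint involving several stages jointly.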
Authors: Ruth Dominguez
Abstract: To attain the net zero emission target by 2050, different energy strategies can be implemented. The use of stochastic dominance constraints allows one to compare the results of each strategy from different perspectives. In this work, we therefore analyse the efficiency of the possible strategies for attaining the emissions targets established by the European Commission for 2050. We present a multistage investment model in generating and storage capacity in which long-term uncertainties, such as demand growth and investment and fuel costs, are considered, adopting the point of view of a central planner. The variability of demand and renewable production in daily operation is taken into account by including a set of representative days of the year. Second-order stochastic dominance constraints are applied to select better strategies under different planning scenarios. A numerical study based on the European power system is presented.
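With a discrete scenario set, second-order stochastic dominance constraints reduce to finitely many linear inequalities; a generic sketch in illustrative notation, not the exact model of the talk:

```latex
% Scenario-based SSD constraint: portfolio outcomes x_s with probabilities p_s
% dominate benchmark realisations y_k with probabilities q_k
\sum_{s} p_s \,\big(y_k - x_s\big)_+ \;\le\; \sum_{j} q_j \,\big(y_k - y_j\big)_+
\qquad \text{for every benchmark realisation } y_k .
```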
Ruth Dominguez Assistant Professor, University of Castilla-La Mancha
Authors: Lubos Hanus and Jozef Barunik
Abstract: We propose the use of machine learning techniques to describe and forecast the conditional probability distribution of asset returns. We recast the problem of forecasting conditional probabilities from a different perspective than traditional ordered binary choice models. Using deep learning methods, we offer a better description of the asset return distribution. A study of the most liquid U.S. stocks shows that the out-of-sample predictive performance of the machine learning methods is promising. We provide a comparison of the machine learning methods to the unordered and ordered binary choice models used in the literature.
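A minimal sketch of the underlying idea on simulated data: discretise returns into ordered states and let a network output a probability for each state, i.e. a forecast of the conditional return distribution. The lag structure, bin count, and network are illustrative choices, not the paper's specification.

```python
# Illustrative conditional-distribution forecast via ordered return states.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
r = rng.standard_normal(1000) * 0.01                      # simulated daily returns
X = np.column_stack([np.roll(r, k) for k in range(1, 6)])[5:]   # 5 lagged returns
y_cont = r[5:]

edges = np.quantile(y_cont, np.linspace(0, 1, 11)[1:-1])  # 10 ordered return states
y = np.digitize(y_cont, edges)                            # class label per observation

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                    random_state=0).fit(X[:-100], y[:-100])
proba = clf.predict_proba(X[-100:])                       # forecasted distributions
print(proba[0].round(3), proba[0].sum())
```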
Authors: Martin Hronec and Jozef Barunik
Abstract: We perform a large-scale empirical test of an asset pricing model based on agents with quantile utility preferences instead of standard expected utility. Using machine learning methods, we predict quantiles of individual stock returns, obtaining whole forecasted distributions. We document heterogeneity in model parameters across different quantiles. We show that forecasting all quantiles together using multi-task deep learning is better than forecasting quantiles individually. The forecasting models allow us to construct portfolios based on the whole distribution instead of just the conditional mean. We demonstrate the economic value of looking at the whole forecasted distribution by forming quantile-based long-short portfolios, as well as by favourable value-at-risk forecasts.
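As background, a minimal sketch of quantile forecasting evaluated with the pinball (quantile) loss. The paper forecasts all quantiles jointly with a multi-task deep network; this illustration fits each quantile separately with a gradient-boosted baseline on simulated data.

```python
# Illustrative per-quantile forecasts and pinball-loss evaluation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def pinball_loss(y, q_hat, tau):
    """Average quantile (pinball) loss of forecasts q_hat at level tau."""
    u = y - q_hat
    return np.mean(np.maximum(tau * u, (tau - 1) * u))

rng = np.random.default_rng(3)
X = rng.standard_normal((600, 5))                          # illustrative predictors
y = X[:, 0] * 0.3 + rng.standard_normal(600) * (1 + 0.5 * np.abs(X[:, 1]))

X_tr, X_te, y_tr, y_te = X[:500], X[500:], y[:500], y[500:]
for tau in (0.1, 0.5, 0.9):
    m = GradientBoostingRegressor(loss="quantile", alpha=tau).fit(X_tr, y_tr)
    print(tau, round(pinball_loss(y_te, m.predict(X_te), tau), 4))
```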
Authors: Michael Althoff
Abstract: TBA
Michael Althoff PhD student, PIMCO Munich Allianz
Authors: Danial Saef
Abstract: Jumps in financial time series, especially at high frequency, are difficult to detect. We evaluate the applicability of existing jump test methodologies, as in Lee and Mykland (2012) and Ait-Sahalia and Jacod (2012), on a diverse cryptocurrency tick dataset. Cryptocurrency markets are highly capitalised, yet they differ from traditional markets due to non-stop trading, lower volume, and high correlation among currencies. These properties lead to higher volatility and make the markets more vulnerable to contagion effects. Existing jump test methodologies are difficult to apply to cryptocurrencies as they do not address these market-specific properties. The evaluation shows that the number of jumps per day differs according to the chosen method. However, both frameworks agree that the frequency of jumps is time-varying and subject to clustering effects. We conclude that existing methodologies should be extended to account for contagion effects and clustered jumps, and we provide suggestions on how to extend them accordingly.
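A heavily simplified sketch in the spirit of the Lee-Mykland statistic on simulated ticks: each return is standardised by a local bipower-variation estimate of volatility and extreme values are flagged as jumps. The noise-robust Lee and Mykland (2012) and Ait-Sahalia and Jacod (2012) tests evaluated in the talk are substantially more involved; the window length and threshold below are illustrative.

```python
# Simplified jump-flagging sketch (not the tests used in the paper).
import numpy as np

rng = np.random.default_rng(4)
r = rng.standard_normal(5000) * 1e-3              # simulated tick returns
r[1000] += 0.02                                   # inject an artificial jump

K = 156                                           # local window, illustrative
abs_prod = np.abs(r[1:]) * np.abs(r[:-1])         # |r_j| * |r_{j-1}|
stats, idx = [], []
for i in range(K, len(r)):
    bpv = (np.pi / 2) * abs_prod[i - K:i - 1].mean()   # local bipower variation
    stats.append(r[i] / np.sqrt(bpv))             # standardised return
    idx.append(i)
stats = np.asarray(stats)
jumps = np.asarray(idx)[np.abs(stats) > 4.0]      # crude fixed threshold
print("flagged jump indices:", jumps)
```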
Authors: Bingling Wang, Yinxing Li, Wolfgang Karl Härdle
Abstract: K-means clustering is one of the most widely used partitioning algorithms in cluster analysis due to its simplicity and computational efficiency. However, K-means does not provide an appropriate clustering result when applied to data with non-spherically shaped clusters. We propose a novel partitioning clustering algorithm based on expectiles. The cluster centers are defined as multivariate expectiles, and clusters are searched via a greedy algorithm by minimizing the within-cluster '$\tau$-variance'. We suggest two schemes: fixed $\tau$ clustering and adaptive $\tau$ clustering. Validated by simulation results, this method beats both K-means and spectral clustering on data with asymmetrically shaped clusters or clusters with a complicated structure, including asymmetric normal, beta and F distributed clusters and mixtures of several different distributions. Applications of adaptive $\tau$ clustering to the crypto-currency (CC) market and of fixed $\tau$ clustering to stock market data are provided. We find evidence that the expectile clusters of the CC market show positively correlated prices and volatilities, indicating a market dominated by retail investors.
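A minimal sketch of expectile-based partitioning on simulated data: centres are component-wise $\tau$-expectiles and points are assigned by an asymmetrically weighted squared distance. This illustrates the general idea only and is not the authors' fixed-$\tau$ or adaptive-$\tau$ algorithms.

```python
# Illustrative expectile clustering (K-means-like, asymmetric squared loss).
import numpy as np

def expectile(x, tau, iters=50):
    """Component-wise tau-expectile of the rows of x via iterative reweighting."""
    mu = x.mean(axis=0)
    for _ in range(iters):
        w = np.where(x > mu, tau, 1 - tau)            # asymmetric weights
        mu = (w * x).sum(axis=0) / w.sum(axis=0)
    return mu

def tau_distance(X, mu, tau):
    """Asymmetrically weighted squared distance of points X to centre mu."""
    d = X - mu
    w = np.where(d > 0, tau, 1 - tau)
    return (w * d ** 2).sum(axis=1)

def expectile_clustering(X, k, tau=0.3, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin([tau_distance(X, c, tau) for c in centres], axis=0)
        centres = np.array([expectile(X[labels == j], tau) if np.any(labels == j)
                            else centres[j] for j in range(k)])
    return labels, centres

rng = np.random.default_rng(5)
X = np.vstack([rng.standard_normal((100, 2)) + m for m in ([0, 0], [4, 4])])
labels, centres = expectile_clustering(X, k=2)
print(centres.round(2))
```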
Bingling Wang PhD student, Humboldt-Universität zu Berlin
Authors: Xinwen Ni, Wolfgang K. Härdle
Abstract: Despite regulators' best intentions to protect consumers and to avoid money laundering, terrorist finance, or other illicit activities, cryptocurrencies' values remain heavily tied to drastic policy changes, as several prominent examples revealed over the past years. Indices have been constructed to track the cryptocurrency markets; however, none of these indices directly addresses regulatory risks. In this paper, we aim to quantify the risks originating from introducing regulations on the cryptocurrency markets and to identify their impact on cryptocurrency investments. We construct a regulatory risk index for the cryptocurrency market based on the frequency of policy-related news coverage. The news data are collected from the top online cryptocurrency news platforms. An LDA model is trained to obtain topics and is applied to calculate the distance between news articles, treating semantic similarity as a binary decision problem of whether an article is policy-related or not. Our regulatory risk index successfully captures major policy-change moments. The movements of both VCRIX, a market volatility index, and the regulatory risk index are synchronous, meaning that the index could be helpful for all participants in the cryptocurrency market. The algorithms and Python code are available for research purposes on www.quantlet.de.
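A minimal sketch of the pipeline described above, on toy headlines: fit an LDA topic model, represent each article by its topic distribution, and flag an article as policy-related when it lies close in topic space to hand-labelled policy seed articles; the index would then be the share of such articles per period. Headlines, the seed labels, and the threshold are purely illustrative, not the authors' data or calibration.

```python
# Illustrative LDA-based policy-relatedness classification on toy headlines.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

news = [
    "regulator proposes new rules for crypto exchanges",
    "central bank warns on cryptocurrency regulation",
    "bitcoin price rallies on strong institutional demand",
    "ethereum upgrade improves transaction throughput",
    "government considers ban on unlicensed crypto trading",
    "miner revenues climb as hash rate hits record",
]
seed_policy = [0, 1]                          # hand-labelled policy articles (toy)

counts = CountVectorizer(stop_words="english").fit_transform(news)
lda = LatentDirichletAllocation(n_components=3, random_state=0)
topics = lda.fit_transform(counts)            # article-by-topic distributions

# Distance to the nearest policy seed article in topic space
dist = np.min(np.linalg.norm(topics[:, None, :] - topics[None, seed_policy, :],
                             axis=2), axis=1)
policy_related = dist < 0.5                   # binary decision, toy threshold
print("policy-related share:", policy_related.mean())
```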
Authors: Karel Kozmik
Abstract: TBA
Karel Kozmik PhD student, Charles University